Bidiagonalization as a fundamental decomposition of data in linear approximation problems
Author
Abstract
• First, the lower bidiagonal matrix A11 with nonzero bidiagonal elements has full column rank and its singular values are simple. Consequently, any zero singular values or repeats that A has must appear in A22.
• Second, A11 has minimal dimensions, and A22 has maximal dimensions, over all orthogonal transformations giving the block structure in (2), without any additional assumptions on the structure of A11 and b1.
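As a concrete illustration of these two properties, the following sketch (NumPy assumed; not code from the paper, and the random matrix is purely illustrative) runs the standard Golub-Kahan bidiagonalization of A started from b, so that the resulting lower bidiagonal matrix plays the role of A11 in (2), and then checks numerically that its singular values are nonzero and distinct.

```python
import numpy as np

def golub_kahan(A, b, tol=1e-12):
    """Lower (Golub-Kahan) bidiagonalization of A started from b.

    Returns U, V and the lower bidiagonal B with A @ V = U @ B and
    U[:, 0] = b / ||b||; it stops early if a bidiagonal element vanishes.
    B plays the role of A11 in the block structure (2)."""
    m, n = A.shape
    U, V, alphas, betas = [], [], [], []
    beta = np.linalg.norm(b)
    u = b / beta
    U.append(u)
    v = A.T @ u
    alpha = np.linalg.norm(v)
    for _ in range(n):
        if alpha <= tol:
            break
        v /= alpha
        V.append(v)
        alphas.append(alpha)
        u = A @ v - alpha * u
        beta = np.linalg.norm(u)
        if beta <= tol:
            break
        u /= beta
        U.append(u)
        betas.append(beta)
        v = A.T @ u - beta * v
        alpha = np.linalg.norm(v)
    k = len(alphas)
    B = np.zeros((len(U), k))
    B[np.arange(k), np.arange(k)] = alphas
    B[np.arange(len(betas)) + 1, np.arange(len(betas))] = betas
    return np.column_stack(U), np.column_stack(V), B

rng = np.random.default_rng(0)
A = rng.standard_normal((8, 5))
b = rng.standard_normal(8)
U, V, B = golub_kahan(A, b)

# With nonzero bidiagonal elements, B (the A11 block) has full column rank
# and distinct singular values; any zero or repeated singular values of A
# can therefore only appear in the remaining A22 block.
svals = np.linalg.svd(B, compute_uv=False)
print("singular values of A11:", svals)           # all positive
print("consecutive gaps      :", np.diff(svals))  # all nonzero, i.e. simple
```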
Similar articles
Some properties of LSQR for large sparse linear least squares problems
It is well-known that many Krylov solvers for linear systems, eigenvalue problems, and singular value decomposition problems have very simple and elegant formulas for residual norms. These formulas not only allow us to further understand the methods theoretically but also can be used as cheap stopping criteria without forming approximate solutions and residuals at each step before convergence t...
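As an illustration of such recursively updated norms, the sketch below (SciPy assumed; not code from this paper) compares the residual norm estimates reported by scipy.sparse.linalg.lsqr, which are maintained by short recurrences inside the bidiagonalization, against residual norms formed explicitly from the returned solution.

```python
import numpy as np
from scipy.sparse.linalg import lsqr

rng = np.random.default_rng(1)
A = rng.standard_normal((200, 50))
b = rng.standard_normal(200)

# lsqr reports r1norm = ||b - A x|| and arnorm = ||A^T (b - A x)||, both
# maintained by short recurrences inside the bidiagonalization rather than
# by forming the residual explicitly at each step.
x, istop, itn, r1norm, r2norm, anorm, acond, arnorm, xnorm, var = lsqr(
    A, b, atol=1e-10, btol=1e-10)

r = b - A @ x
print("recursive ||r||     :", r1norm)
print("explicit  ||r||     :", np.linalg.norm(r))
print("recursive ||A^T r|| :", arnorm)
print("explicit  ||A^T r|| :", np.linalg.norm(A.T @ r))
```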
Inverse subspace problems with applications
Given a square matrix A, the inverse subspace problem is concerned with determining a closest matrix to A with a prescribed invariant subspace. When A is Hermitian, the closest matrix may be required to be Hermitian. We measure distance in the Frobenius norm and discuss applications to Krylov subspace methods for the solution of large-scale linear systems of equations and eigenvalue problems as...
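For the real, non-Hermitian case, one standard closed form (sketched below under the assumption that the prescribed subspace is given by an orthonormal basis Q; the helper name is illustrative) zeroes the block of A that couples the subspace to its orthogonal complement, since that block being zero is exactly the invariance condition and the Frobenius norm is unitarily invariant.

```python
import numpy as np

def nearest_with_invariant_subspace(A, Q):
    """Frobenius-nearest matrix to A having range(Q) as an invariant subspace.

    Q is assumed to have orthonormal columns. Invariance of range(Q) is
    equivalent to (I - Q Q^T) X Q = 0, so the nearest X is obtained by
    zeroing that block of A in the [Q, Q_perp] coordinates."""
    P = Q @ Q.T                                   # projector onto range(Q)
    return A - (np.eye(A.shape[0]) - P) @ A @ P

rng = np.random.default_rng(2)
A = rng.standard_normal((6, 6))
Q, _ = np.linalg.qr(rng.standard_normal((6, 2)))  # prescribed 2-dim subspace

X = nearest_with_invariant_subspace(A, Q)
# range(Q) is invariant under X: X Q stays inside range(Q).
print(np.linalg.norm((np.eye(6) - Q @ Q.T) @ X @ Q))  # ~ 0
print(np.linalg.norm(A - X, "fro"))                   # the attained distance
```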
Divide and Conquer Low-rank Preconditioning Techniques
This paper presents a preconditioning method based on a recursive multilevel low-rank approximation approach. The basic idea is to recursively divide the problem into two and apply a low-rank approximation to a matrix obtained from the Sherman-Morrison formula. The low-rank approximation may be computed by the partial Singular Value Decomposition (SVD) or it can be approximated by the Lanczos bi...
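A minimal one-level sketch of this idea (SciPy assumed; not the recursive multilevel method of the paper, and the splitting and sizes below are illustrative) approximates the correction term by a partial SVD and inverts the resulting matrix with the Sherman-Morrison-Woodbury identity, yielding a preconditioner.

```python
import numpy as np
from scipy.sparse.linalg import svds

rng = np.random.default_rng(3)
n, r, k = 300, 30, 20

# Test splitting A = B + E with B cheap to invert and E numerically low-rank.
W = rng.standard_normal((n, r))
E = (W * np.logspace(5, -3, r)) @ W.T    # symmetric, rapidly decaying spectrum
B = np.diag(1.0 + rng.random(n))         # stand-in for the "easy" part of the splitting
A = B + E

# Rank-k partial SVD of the correction term.
U, s, Vt = svds(E, k=k)

# Sherman-Morrison-Woodbury inverse of B + U diag(s) Vt, used as a preconditioner:
# (B + U S V^T)^{-1} = B^{-1} - B^{-1} U (S^{-1} + V^T B^{-1} U)^{-1} V^T B^{-1}
Binv = np.diag(1.0 / np.diag(B))
core = np.linalg.inv(np.diag(1.0 / s) + Vt @ Binv @ U)
M = Binv - Binv @ U @ core @ Vt @ Binv

print("cond(A)   :", np.linalg.cond(A))
print("cond(M A) :", np.linalg.cond(M @ A))  # typically far smaller than cond(A)
```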
Extended Lanczos Bidiagonalization for Dimension Reduction in Information Retrieval
We describe an extended bidiagonalization scheme designed to compute low-rank approximations of very large data matrices. Its goal is identical to that of the truncated singular value decomposition, but it is significantly cheaper. It consists in an extension of the standard Lanczos bidiagonalization that improves its approximation capabilities, while keeping the computational cost reasonable. ...
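For context, the sketch below (SciPy assumed; it uses a standard Lanczos-type partial SVD rather than the extended scheme described here, and the matrix, query, and index values are illustrative) shows the kind of dimension-reduction workflow in information retrieval that such low-rank approximations support.

```python
import numpy as np
from scipy.sparse import random as sparse_random
from scipy.sparse.linalg import svds

# Sparse "term-document" matrix (terms x documents), as in latent semantic indexing.
A = sparse_random(5000, 800, density=0.005, format="csr", random_state=42)

# Rank-k partial SVD via a Lanczos-type iteration (scipy's default ARPACK solver);
# the extended bidiagonalization targets the same kind of factorization more cheaply.
k = 50
U, s, Vt = svds(A, k=k)

docs_k = Vt.T * s                 # documents in the k-dimensional latent space
q = np.zeros(5000)
q[[10, 123, 4096]] = 1.0          # toy query touching a few term indices
q_k = q @ U                       # the same query in the latent space

# Ranking by docs_k @ q_k is the same as ranking by (U diag(s) Vt)^T q, i.e.
# matching against the rank-k approximation of A instead of A itself.
scores = docs_k @ q_k
print("top 5 matching documents:", np.argsort(scores)[::-1][:5])
```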
Computing Projections with LSQR
LSQR uses the Golub-Kahan bidiagonalization process to solve sparse least-squares problems with and without regularization. In some cases, projections of the right-hand side vector are required, rather than the least-squares solution itself. We show that projections may be obtained from the bidiagonalization as linear combinations of (theoretically) orthogonal vectors. Even the least-squares so...
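The sketch below (SciPy assumed; it does not reproduce the paper's formulas) checks the underlying fact that the converged LSQR iterate, built from the Golub-Kahan bidiagonalization vectors, yields the orthogonal projection of the right-hand side onto range(A), compared here against a projection formed from a dense QR factorization.

```python
import numpy as np
from scipy.sparse.linalg import lsqr

rng = np.random.default_rng(5)
A = rng.standard_normal((300, 40))   # full column rank
b = rng.standard_normal(300)

# LSQR constructs its iterate in the Krylov subspace generated by the
# bidiagonalization; at convergence, A @ x is the orthogonal projection of b
# onto range(A), expressed through the (theoretically orthonormal) left vectors.
x = lsqr(A, b, atol=1e-12, btol=1e-12)[0]
proj_lsqr = A @ x

# Reference projection from a dense QR factorization of A.
Q, _ = np.linalg.qr(A)
proj_qr = Q @ (Q.T @ b)

print("difference between the two projections:",
      np.linalg.norm(proj_lsqr - proj_qr))
```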